Linear dimensionality reduction using relevance weighted LDA
Authors
Abstract
Linear discriminant analysis (LDA) is one of the most traditional linear dimensionality reduction methods. This paper incorporates inter-class relationships as relevance weights into the estimation of the overall within-class scatter matrix in order to improve the performance of the basic LDA method and some of its improved variants. We demonstrate that in some specific situations the standard multi-class LDA almost totally fails to find a discriminative subspace if the proposed relevance weights are not incorporated. In order to estimate the relevance weights of the individual within-class scatter matrices, we propose several methods, one of which employs evolution strategies. © 2004 Pattern Recognition Society. Published by Elsevier Ltd. All rights reserved.
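To make the idea concrete, here is a minimal sketch of LDA in which each class's contribution to the within-class scatter enters with its own relevance weight. The function name and signature are mine, the weights are assumed to be supplied externally, and the paper's estimation schemes (including the evolution-strategy one) are not reproduced; with uniform weights this collapses to ordinary multi-class LDA.

```python
import numpy as np

def relevance_weighted_lda(X, y, weights=None, n_components=None):
    """Sketch of LDA with a relevance-weighted within-class scatter.

    X : (n_samples, n_features) data matrix
    y : (n_samples,) class labels
    weights : dict mapping class label -> relevance weight; how such
              weights are estimated in the paper (e.g. via evolution
              strategies) is not reproduced here.
    """
    classes = np.unique(y)
    n_features = X.shape[1]
    if weights is None:
        weights = {c: 1.0 for c in classes}      # uniform weights = plain LDA
    mean_total = X.mean(axis=0)

    Sw = np.zeros((n_features, n_features))      # weighted within-class scatter
    Sb = np.zeros((n_features, n_features))      # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mu_c = Xc.mean(axis=0)
        Sw += weights[c] * (Xc - mu_c).T @ (Xc - mu_c)
        diff = (mu_c - mean_total)[:, None]
        Sb += Xc.shape[0] * diff @ diff.T

    # Generalized eigenproblem Sb v = lambda Sw v; a pseudo-inverse guards
    # against a singular Sw in this illustrative version.
    eigvals, eigvecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(eigvals.real)[::-1]
    k = n_components if n_components is not None else len(classes) - 1
    return eigvecs[:, order[:k]].real            # projection matrix
```

The abstract's point is that choosing non-uniform relevance weights can recover discriminative directions that the standard criterion misses.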
منابع مشابه
Weighted Generalized LDA for Undersampled Problems
Linear discriminant analysis (LDA) is a classical approach for dimensionality reduction. It aims to maximize the between-class scatter and minimize the within-class scatter, thereby maximizing class discrimination. However, for undersampled problems, where the data dimensionality is larger than the sample size, all scatter matrices are singular and classical LDA runs into computational difficulty. Rec...
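The computational difficulty in the undersampled case is the singularity of the scatter matrices, which a few lines of NumPy make visible; the sample and feature counts below are arbitrary toy values chosen for illustration.

```python
import numpy as np

# Undersampled toy data: 10 samples in 100 dimensions, two classes.
rng = np.random.default_rng(0)
X = rng.normal(size=(10, 100))
y = np.repeat([0, 1], 5)

# The within-class scatter has rank at most n_samples - n_classes,
# so it is singular whenever dimensionality exceeds sample size.
Sw = sum(
    (X[y == c] - X[y == c].mean(axis=0)).T @ (X[y == c] - X[y == c].mean(axis=0))
    for c in np.unique(y)
)
print(np.linalg.matrix_rank(Sw))   # 8 (= 10 - 2), far below 100 -> Sw is singular
```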
2D Dimensionality Reduction Methods without Loss
In this paper, several two-dimensional extensions of principal component analysis (PCA) and linear discriminant analysis (LDA) techniques have been applied in a lossless dimensionality reduction framework for face recognition. In this framework, the benefits of dimensionality reduction were used to improve the performance of its predictive model, which was a support vector machine (...
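Two-dimensional extensions of this kind operate on image matrices directly rather than on flattened vectors. The sketch below shows the standard 2DPCA-style projection step that such methods build on; it is not the cited paper's lossless framework, and the helper name and interface are illustrative.

```python
import numpy as np

def two_d_pca(images, n_components):
    """Generic 2DPCA-style projection on a stack of image matrices.

    images : (N, h, w) array of images; no h*w x h*w covariance is ever formed.
    """
    mean_img = images.mean(axis=0)                   # (h, w) mean image
    centered = images - mean_img
    # Image covariance (w x w): sum of A_i^T A_i over the centered images.
    G = np.einsum('nij,nik->jk', centered, centered) / len(images)
    eigvals, eigvecs = np.linalg.eigh(G)             # symmetric -> eigh
    W = eigvecs[:, ::-1][:, :n_components]           # top eigenvectors
    return images @ W, W                             # projected (N, h, k) and basis
```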
Discriminant Analysis for Dimensionality Reduction: An Overview of Recent Developments
Many biometric applications such as face recognition involve data with a large number of features [1–3]. Analysis of such data is challenging due to the curse of dimensionality [4, 5], which states that an enormous number of samples is required to make accurate predictions on high-dimensional problems. Dimensionality reduction, which extracts a small number of features by removing ...
Multiclass Linear Dimension Reduction by Weighted Pairwise Fisher Criteria
We derive a class of computationally inexpensive linear dimension reduction criteria by introducing a weighted variant of the well-known K-class Fisher criterion associated with linear discriminant analysis (LDA). It can be seen that LDA weights the contributions of individual class pairs according to the Euclidean distance between the respective class means. We generalize upon LDA by introducing a dif...
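A sketch of the pairwise construction described above: the between-class scatter is assembled pair by pair, with each pair weighted by a function of the distance between its class means. The default weighting used here is only an illustration; the specific criterion proposed in the cited paper is cut off in the snippet and is not reproduced.

```python
import numpy as np

def pairwise_weighted_between_scatter(X, y, weight_fn=lambda d: 1.0 / d**2):
    """Between-class scatter built from weighted class pairs.

    weight_fn : weighting of a pair as a function of the Euclidean distance
                between its class means. The default 1/d^2, which
                de-emphasizes already well-separated pairs, is illustrative
                and is not the criterion from the cited paper.
    """
    classes = np.unique(y)
    n = len(y)
    means = {c: X[y == c].mean(axis=0) for c in classes}
    priors = {c: np.sum(y == c) / n for c in classes}

    Sb = np.zeros((X.shape[1], X.shape[1]))
    for i, ci in enumerate(classes):
        for cj in classes[i + 1:]:
            diff = (means[ci] - means[cj])[:, None]
            d = np.linalg.norm(diff)
            Sb += priors[ci] * priors[cj] * weight_fn(d) * (diff @ diff.T)
    return Sb
```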
Generalized Kernel Discriminant Analysis using Weighting Function with Applications to Feature Extraction
Linear discriminant analysis (LDA) is a classical approach for dimensionality reduction. However, LDA has shortcomings: one of the scatter matrices is required to be nonsingular, nonlinearly clustered structure is not easily captured, and outlier classes adversely affect its performance. In order to solve these problems, in this paper, several no...
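For orientation, the sketch below shows a generic multi-class kernel discriminant analysis with a small ridge term handling the nonsingularity requirement mentioned above. The weighting function that the cited paper adds against outlier classes is not reproduced, and the RBF kernel, regularization constant, and function names are my own choices.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian RBF kernel matrix between the rows of A and B."""
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

def kernel_discriminant_analysis(X, y, gamma=1.0, reg=1e-3, n_components=None):
    """Unweighted multi-class kernel discriminant analysis sketch.

    A ridge term `reg` keeps the within-class matrix invertible; the
    outlier-class weighting of the cited paper is not included.
    """
    n = len(y)
    classes = np.unique(y)
    K = rbf_kernel(X, X, gamma)

    m_all = K.mean(axis=1)
    M = np.zeros((n, n))                       # between-class term in kernel space
    N = np.zeros((n, n))                       # within-class term in kernel space
    for c in classes:
        idx = np.where(y == c)[0]
        Kc = K[:, idx]                         # n x n_c kernel block
        m_c = Kc.mean(axis=1)
        diff = (m_c - m_all)[:, None]
        M += len(idx) * diff @ diff.T
        H = np.eye(len(idx)) - np.full((len(idx), len(idx)), 1.0 / len(idx))
        N += Kc @ H @ Kc.T

    # Generalized eigenproblem M a = lambda (N + reg I) a.
    eigvals, eigvecs = np.linalg.eig(np.linalg.solve(N + reg * np.eye(n), M))
    order = np.argsort(eigvals.real)[::-1]
    k = n_components if n_components is not None else len(classes) - 1
    A = eigvecs[:, order[:k]].real             # expansion coefficients
    return K @ A                               # projected training data
```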
Journal: Pattern Recognition
Volume: 38, Issue: -
Pages: -
Year of publication: 2005